Goto

Predictive Attractor Models

Neural Information Processing Systems

The task of sequential memory is considered challenging for models operating under biological constraints (i.e., local synaptic computations) for many reasons, including catastrophic forgetting, …


Supplementary Material for Flat Seeking Bayesian Neural Networks

Van-Anh Nguyen, Tung-Long Vuong

Neural Information Processing Systems

The proof can be found in Chapter 27 of [6]. For the non-flat version, the update is similar to mini-batch SGD except that we add small Gaussian noise to the particle models. In Section 4.2 of the main paper, we provide a comprehensive analysis of the performance. In the experiments presented in Tables 1 and 2 of the main paper, we train all models for 300 epochs using SGD with a learning rate of 0.1 and a cosine schedule. For the Deep-Ensemble, SGLD, SGVB, and SGVB-LRT baselines, we reproduce results using the same hyper-parameters and procedures as our flat versions. ImageNet: This is a large and challenging dataset with 1000 classes.
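The non-flat update described above (a plain mini-batch SGD step with small Gaussian noise added to each particle model) can be sketched as follows. This is a minimal illustration, not the authors' code; the function name `noisy_sgd_step` and the `noise_std` value are assumptions for demonstration.

```python
import numpy as np

def noisy_sgd_step(particles, grads, lr=0.1, noise_std=1e-4, rng=None):
    """One update for an ensemble of particle models: a mini-batch SGD
    step plus small Gaussian perturbations, as in the non-flat version.
    `noise_std` is an illustrative value, not taken from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    return [w - lr * g + noise_std * rng.standard_normal(w.shape)
            for w, g in zip(particles, grads)]
```

Setting `noise_std=0` recovers ordinary mini-batch SGD on each particle.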
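The cosine learning-rate schedule used in the 300-epoch training runs can be written as a simple formula; a minimal sketch, assuming the common variant that anneals from the base rate of 0.1 down to 0 (the function name `cosine_lr` is a hypothetical helper, not from the paper).

```python
import math

def cosine_lr(epoch, total_epochs=300, base_lr=0.1):
    """Cosine-annealed learning rate: starts at base_lr and decays
    smoothly to 0 over total_epochs, matching the stated setup of
    SGD with lr 0.1 and a cosine schedule for 300 epochs."""
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * epoch / total_epochs))
```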